LPBoost with Strong Classifiers

Authors
Abstract


Similar articles

Entropy Regularized LPBoost

In this paper we discuss boosting algorithms that maximize the soft margin of the produced linear combination of base hypotheses. LPBoost is the most straightforward boosting algorithm for doing this. It maximizes the soft margin by solving a linear programming problem. While it performs well on natural data, there are cases where the number of iterations is linear in the number of examples ins...
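
As a rough illustration of the linear program that LPBoost solves for a fixed pool of base hypotheses, the following Python sketch builds the standard soft-margin LP and hands it to a generic LP solver. It is only a sketch under assumptions: the function name lpboost_master, the margin matrix M, and the parameter nu are illustrative and not taken from the abstract above.

import numpy as np
from scipy.optimize import linprog

def lpboost_master(M, nu=0.1):
    # M[i, t] = y_i * h_t(x_i): margin of base hypothesis t on example i (assumed input).
    n, T = M.shape
    D = 1.0 / (nu * n)  # penalty on the slack variables (soft-margin trade-off)
    # Variables: [alpha_1..alpha_T, xi_1..xi_n, rho]; minimise -rho + D * sum(xi),
    # i.e. maximise the soft margin rho minus the slack penalty.
    c = np.concatenate([np.zeros(T), D * np.ones(n), [-1.0]])
    # Margin constraints: rho - xi_i - sum_t alpha_t * M[i, t] <= 0 for every example i.
    A_ub = np.hstack([-M, -np.eye(n), np.ones((n, 1))])
    b_ub = np.zeros(n)
    # The hypothesis weights form a convex combination: sum_t alpha_t = 1.
    A_eq = np.concatenate([np.ones(T), np.zeros(n), [0.0]]).reshape(1, -1)
    b_eq = np.array([1.0])
    bounds = [(0, None)] * (T + n) + [(None, None)]  # alpha, xi >= 0; rho is free
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    alpha, rho = res.x[:T], res.x[-1]
    return alpha, rho

In the full boosting loop this LP would be re-solved each time a new base hypothesis (a new column of M) is added, which is where the iteration-count issue mentioned above arises.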


Towards a Theory of Strong Overgeneral Classifiers

We analyse the concept of strong overgeneral rules, the Achilles’ heel of traditional Michigan-style learning classifier systems, using both the traditional strength-based and newer accuracy-based approaches to rule fitness. We argue that different definitions of overgenerality are needed to match the goals of the two approaches, present minimal conditions and environments which will support st...


Reasoning with Classifiers

Research in machine learning concentrates on the study of learning single concepts from examples. In this framework the learner attempts to learn a single hidden function from a collection of examples, assumed to be drawn independently from some unknown probability distribution. However, in many cases – as in most natural language and visual processing situations – decisions depend on the outco...


Classifiers with limited connectivity

For many neural network models that are based on perceptrons, the number of activity patterns that can be classified is limited by the number of plastic connections that each neuron receives, even when the total number of neurons is much larger. This poses the problem of how the biological brain can take advantage of its huge number of neurons given that the connectivity is extremely sparse, es...


Improved Boosting Algorithm Using Combined Weak Classifiers

From the family of corrective boosting algorithms (e.g. AdaBoost, LogitBoost) to totally corrective algorithms (e.g. LPBoost, TotalBoost, SoftBoost, ERLPBoost), we analyse these methods of sample-weight updating. Corrective boosting algorithms update the sample weights according to the last hypothesis only; in contrast, totally corrective algorithms update the weights with the best one of all weak classifi...
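
For contrast with the totally corrective LP above, the corrective style of update can be sketched in a few lines of Python. This is an AdaBoost-style exponential reweighting that looks only at the most recent weak classifier; the function and argument names are illustrative assumptions, not the exact scheme analysed in the paper.

import numpy as np

def corrective_update(w, margins, alpha):
    # margins[i] = y_i * h_last(x_i): only the last weak classifier is consulted,
    # which is what distinguishes corrective from totally corrective updates.
    w = w * np.exp(-alpha * margins)
    return w / w.sum()  # renormalise so the weights form a distribution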



Journal

Journal title: International Journal of Computational Intelligence Systems

Year: 2010

ISSN: 1875-6883

DOI: 10.2991/ijcis.2010.3.s1.7